

Search for: All records

Creators/Authors contains: "Cotter, Kelley"

Note: Clicking a Digital Object Identifier (DOI) link will take you to an external site maintained by the publisher. Some full-text articles may not yet be available free of charge during the publisher's embargo period.

Some links on this page may take you to non-federal websites. Their policies may differ from this site.

  1. The growing ubiquity of algorithms in everyday life has prompted cross-disciplinary interest in what people know about algorithms. The purpose of this article is to build on this growing literature by highlighting a particular way of knowing algorithms evident in past work, but, as yet, not clearly explicated. Specifically, I conceptualize practical knowledge of algorithms to capture knowledge located at the intersection of practice and discourse. Rather than knowing that an algorithm is/does X, Y, or Z, practical knowledge entails knowing how to accomplish X, Y, or Z within algorithmically mediated spaces as guided by the discursive features of one’s social world. I conceptualize practical knowledge in conversation with past work on algorithmic knowledge and theories of knowing, and as empirically grounded in a case study of a leftist online community known as “BreadTube.” 
  2. Efforts to govern algorithms have centered the ‘black box problem,’ or the opacity of algorithms resulting from corporate secrecy and technical complexity. In this article, I conceptualize a related and equally fundamental challenge for governance efforts: black box gaslighting. Black box gaslighting captures how platforms may leverage perceptions of their epistemic authority on their algorithms to undermine users’ confidence in what they know about algorithms and destabilize credible criticism. I explicate the concept of black box gaslighting through a case study of the ‘shadowbanning’ dispute within the Instagram influencer community, drawing on interviews with influencers (n = 17) and online discourse materials (e.g., social media posts, blog posts, videos). I argue that black box gaslighting presents a formidable deterrent for those seeking accountability: an epistemic contest over the legitimacy of critiques in which platforms hold the upper hand. At the same time, I suggest we must be mindful of the partial nature of platforms’ claim to ‘the truth,’ as well as the value of user understandings of algorithms.